Web Survey Bibliography
In this study, we investigate marketing researchers' views on two important concepts in survey research: response rate and response bias. To answer the stated research questions, we collected both primary and secondary data. The primary data were collected from active members of the Academy of Marketing Science. Eight versions of an excerpt were created from an actual article that had recently been accepted for publication.
The first treatment concerned the description of the population: subjects received one of two versions of the excerpt, one describing a Canadian sample and the other a so-called North American sample. The second treatment manipulated the initial number of surveys sent out, which in turn changed the reported response rate: the two versions reported 500 vs. 5,000 initial surveys, varying the response rate between 50.2% and 5.1%. The third treatment manipulated the use of the Armstrong and Overton (1977) citation. The first version contained a sentence stating that early and late respondents were compared and that no significant differences were found, offered as evidence of no response bias, with a citation of Armstrong and Overton (1977). The second version instead included a table of the expected demographics of the population of interest. In addition, subjects were assigned to one of two conditions in which they evaluated the excerpt either as an author or as a reviewer. In the invitation email, respondents were asked to toss a coin, or to click a web link that tossed the coin for them, and then to select the link corresponding to the outcome. To assess popular techniques for enhancing response rates, we divided the sample into a pre-notification group, a reminder group, a combined pre-notification and reminder group, and a control group that received neither treatment. The results revealed that, in our sample, none of these techniques improved the response rate.
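To make the second treatment concrete, response rate is simply the number of completed surveys divided by the number initially sent out. A minimal sketch (the completion counts of 251 and 255 are hypothetical figures chosen only to reproduce the percentages reported above, not numbers taken from the study):

```python
def response_rate(completed: int, sent: int) -> float:
    """Response rate as a percentage of surveys initially sent out."""
    return 100 * completed / sent

# Hypothetical completion counts that reproduce the reported rates:
rate_small = response_rate(251, 500)    # 50.2
rate_large = response_rate(255, 5000)   # 5.1
```

Holding the number of completed surveys roughly constant while inflating the number of surveys sent out is what drives the large gap between the two reported rates.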
The secondary data were collected from major marketing journals (Journal of Marketing - JM, Journal of Marketing Research - JMR, and Journal of the Academy of Marketing Science - JAMS) over the period 2005-2010. The final sample consisted of 68 JM, 23 JMR, and 84 JAMS articles. In addition, we randomly selected 31 rejected articles from the Journal of Business Research (JBR) archives.
The results of the study revealed that survey researchers do not clearly grasp the concepts of response rate and response bias. In addition, the results demonstrated that data quality should be measured by the sample's representativeness of the population and by the researcher's ability to reduce response and non-response bias. Further, techniques used to enhance response rates, such as reminder and pre-notification letters as well as incentives, are not effective and are likely to introduce additional response bias into a study. The results also showed that the optimal data collection method for researchers to adopt is the combination method.
Book section